Joint entropy

In information theory, joint entropy is a measure of the uncertainty associated with a set of variables.
==Definition==
The joint Shannon entropy of two variables X and Y is defined as
:H(X,Y) = -\sum_{x} \sum_{y} P(x,y) \log_2 P(x,y)
where x and y are particular values of X and Y, respectively, P(x,y) is the joint probability of these values occurring together, and P(x,y) \log_2 P(x,y) is defined to be 0 if P(x,y) = 0.
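As a minimal numerical sketch of this definition (the function and variable names below are illustrative, not part of the article): given the joint distribution P(x,y) as a table of probabilities, H(X,Y) is the sum of -P(x,y) \log_2 P(x,y) over all pairs with non-zero probability.

import math

def joint_entropy(joint_probs):
    """Joint Shannon entropy H(X, Y) in bits from a joint probability table.

    joint_probs maps each pair (x, y) to P(x, y); following the convention
    above, pairs with P(x, y) = 0 contribute 0 to the sum.
    """
    return -sum(p * math.log2(p) for p in joint_probs.values() if p > 0)

# Example: two independent fair coin flips. Every pair has probability 1/4,
# so H(X, Y) = 4 * (1/4) * log2(4) = 2 bits.
table = {("H", "H"): 0.25, ("H", "T"): 0.25, ("T", "H"): 0.25, ("T", "T"): 0.25}
print(joint_entropy(table))  # 2.0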
For more than two variables X_1, ..., X_n this expands to
:H(X_1, \ldots, X_n) = -\sum_{x_1} \cdots \sum_{x_n} P(x_1, \ldots, x_n) \log_2 P(x_1, \ldots, x_n)
where x_1, \ldots, x_n are particular values of X_1, \ldots, X_n, respectively, P(x_1, \ldots, x_n) is the probability of these values occurring together, and P(x_1, \ldots, x_n) \log_2 P(x_1, \ldots, x_n) is defined to be 0 if P(x_1, \ldots, x_n) = 0.
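The same quantity for n variables can be estimated from observed data by replacing P(x_1, \ldots, x_n) with empirical frequencies. The sketch below assumes the observations are given as n-tuples; the function and variable names are illustrative, not part of the article.

from collections import Counter
import math

def joint_entropy_from_samples(samples):
    """Estimate H(X_1, ..., X_n) in bits from observed n-tuples,
    using empirical frequencies in place of the true joint probabilities."""
    counts = Counter(samples)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Example: three binary variables with X_3 = X_1 XOR X_2. Only four of the
# eight possible triples occur, each with empirical probability 1/4,
# so the estimate is 2 bits.
samples = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)] * 25
print(joint_entropy_from_samples(samples))  # 2.0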

Source of excerpt: Wikipedia, the free encyclopedia. Read the full "joint entropy" article on Wikipedia.


